On the weighted approximation of continuously differentiable functions
Authors
Abstract
Similar articles
Modelling with twice continuously differentiable functions ∗
Many real-life situations can be described using twice continuously differentiable functions over convex sets with interior points. Such functions have an interesting separation property: at every interior point of the set, they separate particular classes of quadratic convex functions from classes of quadratic concave functions. Using this property we introduce new characterizations of the deri...
Minimal Approximate Hessians for Continuously Gâteaux Differentiable Functions
In this paper, we investigate minimal (weak) approximate Hessians, and completely answer the open questions raised by V. Jeyakumar and X. Q. Yang. As applications, we first give a generalised Taylor’s expansion in terms of a minimal weak approximate Hessian. Then we characterise the convexity of a continuously Gâteaux differentiable function. Finally some necessary and sufficient optimality con...
Remarks on Lp-approximation of infinitely differentiable multivariate functions
We study the Lp-approximation problem (1 ≤ p < ∞) for infinitely differentiable d-variate functions with respect to the worst case error. In particular, we correct a mistake in the argumentation of Novak and Woźniakowski [2], who showed that the problem is intractable. The main ingredients are arguments from convex geometry, as well as a probabilistic calculation.
On the Set of Fixed Points and Periodic Points of Continuously Differentiable Functions
The set of periodic points of self-maps of intervals has been studied for different reasons. The functions with smaller sets of periodic points are more likely not to share a periodic point. Of course, one has to decide what “big” or “small” means and how to describe this notion. In this direction one would be interested in studying the size of the sets of periodic points of self-maps of an int...
Continuously Differentiable Exponential Linear Units
Exponential Linear Units (ELUs) are a useful rectifier for constructing deep learning architectures, as they may speed up and otherwise improve learning by virtue of not having vanishing gradients and by having mean activations near zero [1]. However, the ELU activation as parametrized in [1] is not continuously differentiable with respect to its input when the shape parameter α is not equal to 1...
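The non-differentiability claimed in this abstract can be checked numerically. The sketch below assumes the standard ELU formula, x for x > 0 and α(exp(x) − 1) otherwise, and the CELU reparametrization max(0, x) + min(0, α(exp(x/α) − 1)); one-sided finite differences at 0 expose the kink that ELU has for α ≠ 1 and CELU does not.

```python
import math

def elu(x, alpha=1.0):
    # Standard ELU: identity for x > 0, scaled exponential below zero.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def celu(x, alpha=1.0):
    # CELU: the exponent is rescaled by alpha, so the left derivative
    # exp(x / alpha) tends to 1 as x -> 0-, matching the right derivative.
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

def slope(f, x, h=1e-6):
    # One-sided finite-difference slope, used to probe the kink at 0.
    return (f(x + h) - f(x)) / h

alpha = 2.0
# ELU with alpha != 1: left slope ~ alpha, right slope ~ 1 -> kink at 0.
elu_left = slope(lambda x: elu(x, alpha), -1e-6)
elu_right = slope(lambda x: elu(x, alpha), 0.0)
# CELU: both one-sided slopes ~ 1 for any alpha > 0.
celu_left = slope(lambda x: celu(x, alpha), -1e-6)
celu_right = slope(lambda x: celu(x, alpha), 0.0)
print(round(elu_left, 3), round(elu_right, 3),
      round(celu_left, 3), round(celu_right, 3))
```

With α = 2 the ELU slopes disagree across 0 (about 2 on the left, 1 on the right), while the CELU slopes agree, which is the continuity property the paper's title refers to.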
Journal
Journal title: Proceedings of the American Mathematical Society
Year: 1991
ISSN: 0002-9939
DOI: 10.1090/s0002-9939-1991-1037217-6